On Convex Optimization, Fat Shattering and Learning

Authors

  • Nathan Srebro
  • Karthik Sridharan
Abstract

We consider the oracle complexity of the problem under the oracle-based optimization model introduced by Nemirovski & Yudin (1978). We show that the oracle complexity can be lower bounded by the fat-shattering dimension, a key tool in learning theory introduced by Kearns & Schapire (1990). Using this result, we proceed to establish upper bounds on learning rates for agnostic PAC learning with linear predictors in terms of oracle complexity, thus showing an inherent relationship between learning and convex optimization.
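For readers unfamiliar with the parameter, the fat-shattering dimension referenced here has the following standard definition (a textbook formulation, not quoted from the paper itself):

```latex
A set $\{x_1,\dots,x_d\}$ is said to be $\varepsilon$-shattered by a real-valued
function class $F$ if there exist witnesses $s_1,\dots,s_d \in \mathbb{R}$ such
that for every sign pattern $b \in \{0,1\}^d$ there is some $f_b \in F$ with
\[
  f_b(x_i) \ge s_i + \varepsilon \ \text{ when } b_i = 1,
  \qquad
  f_b(x_i) \le s_i - \varepsilon \ \text{ when } b_i = 0 .
\]
The fat-shattering dimension $\mathrm{fat}_\varepsilon(F)$ is the cardinality
of the largest $\varepsilon$-shattered set.
```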


Related articles

On the Size of Convex Hulls of Small Sets

We investigate two different notions of "size" which appear naturally in Statistical Learning Theory. We present quantitative estimates on the fat-shattering dimension and on the covering numbers of convex hulls of sets of functions, given the necessary data on the original sets. The proofs we present are relatively simple since they do not require extensive background in convex geometry.


Rademacher averages and phase transitions in Glivenko-Cantelli classes

We introduce a new parameter which may replace the fat-shattering dimension. Using this parameter we are able to provide improved complexity estimates for the agnostic learning problem with respect to any norm. Moreover, we show that if fat_ε(F) = O(ε^{-p}) then F displays a clear phase transition which occurs at p = 2. The phase transition appears in the sample complexity estimates, covering numbers estimates...


Learning From An Optimization Viewpoint

Optimization has always played a central role in machine learning, and advances in the field of optimization and mathematical programming have greatly influenced machine learning models. However, the connection between optimization and learning is much deeper: one can phrase statistical and online learning problems directly as corresponding optimization problems. In this dissertation I take this...


Entropy and the Shattering Dimension

The shattering dimension of a class is a real-valued version of the Vapnik-Chervonenkis dimension. We will present a solution to Talagrand's entropy problem, showing that the L2-covering numbers of every uniformly bounded class of functions are exponential in the shattering dimension of the class. Formally, we prove that there are absolute constants K and c such that for every 0 < t ≤ 1 and any ...


Bounding the Fat Shattering Dimension of a Composition Function Class Built Using a Continuous Logic Connective

We begin this report by describing the Probably Approximately Correct (PAC) model for learning a concept class, consisting of subsets of a domain, and a function class, consisting of functions from the domain to the unit interval. Two combinatorial parameters, the Vapnik-Chervonenkis (VC) dimension and its generalization, the Fat Shattering dimension of scale ε, are explained and a few examples...
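For a finite function class on a finite domain, the fat-shattering dimension mentioned above can be computed by brute force directly from its definition. The sketch below is illustrative (the function names and the toy class are my own, not from the report); it exploits the fact that each witness may, without loss of generality, be taken of the form f(x_i) - ε for some f in the class:

```python
from itertools import combinations, product

def is_eps_shattered(points, F, eps):
    """True if `points` is eps-shattered by the finite class F.

    Each f in F is a dict mapping domain points to real values."""
    d = len(points)
    # A witness s_i may be assumed to equal f(x_i) - eps for some f in F:
    # raising s_i to the nearest such value preserves all constraints.
    candidates = [sorted({f[x] - eps for f in F}) for x in points]
    for witnesses in product(*candidates):
        if all(any(all((f[x] >= s + eps) if b else (f[x] <= s - eps)
                       for x, s, b in zip(points, witnesses, pattern))
                   for f in F)
               for pattern in product([0, 1], repeat=d)):
            return True
    return False

def fat_shattering_dim(domain, F, eps):
    """Largest d such that some d-point subset of `domain` is eps-shattered."""
    best = 0
    for d in range(1, len(domain) + 1):
        if any(is_eps_shattered(list(S), F, eps)
               for S in combinations(domain, d)):
            best = d
    return best

# Toy example: all {0,1}-valued functions on a 3-point domain.  At scale
# eps = 0.5 the class eps-shatters all three points (witnesses 0.5), so the
# fat-shattering dimension coincides with the VC dimension; at eps = 0.6 the
# required gap 2*eps exceeds the range of the class, so nothing is shattered.
domain = [0, 1, 2]
F = [dict(zip(domain, vals)) for vals in product([0.0, 1.0], repeat=3)]
print(fat_shattering_dim(domain, F, 0.5))  # -> 3
print(fat_shattering_dim(domain, F, 0.6))  # -> 0
```

The toy class illustrates the scale-sensitivity of the parameter: unlike the VC dimension, fat_ε can collapse to zero once ε exceeds half the spread of the function values.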



Journal title:

Volume   Issue

Pages  -

Publication date: 2013